Better copy/import

From: Steven Lane
Subject: Better copy/import
Msg-id: v03007803b781d5ebe403@[65.15.153.184]
In response to: Re: error status 139 (Tom Lane <tgl@sss.pgh.pa.us>)
Responses: Re: Better copy/import (Gary Stainburn <gary.stainburn@ringways.co.uk>)
List: pgsql-admin
Hello all:

Sorry for the bad subject line on the last version of this post.

I'm trying to load about 10M rows of data into a simple Postgres table. The
data is straightforward and fairly clean, but it does have glitches every few
tens of thousands of rows. My problem is that when COPY hits a bad row it
aborts the entire load, leaving me to go back, delete or clean up the
offending row, and try again.

Is there a way to import the records that just skips the bad ones and tells
me which ones they are? Loading this much data is pretty time-consuming,
especially when I keep having to repeat the whole load just to find each new
bad row. Is there a better way?
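
One possible workaround, sketched below in Python with psycopg2: COPY the
file in chunks, and when a chunk fails, roll back and retry that chunk row
by row, writing the rows that still fail to a reject file. The table name
"mytable", the file "data.txt", the chunk size, and the connection string
are all placeholders to adapt; this is a sketch, not a tested loader.

#!/usr/bin/env python
"""Sketch: bulk-load a delimited file, skipping and logging bad rows.

Assumes psycopg2; "mytable", "data.txt", and the DSN are placeholders.
"""
import io
import psycopg2

CHUNK_SIZE = 10000  # rows per COPY attempt; tune to taste

def copy_rows(conn, rows):
    """COPY a list of raw text lines in a single command."""
    buf = io.StringIO("".join(rows))
    with conn.cursor() as cur:
        cur.copy_expert("COPY mytable FROM STDIN", buf)
    conn.commit()

def load(path, dsn):
    conn = psycopg2.connect(dsn)
    rejects = open("rejects.txt", "w")

    def flush(chunk):
        try:
            copy_rows(conn, chunk)          # fast path: whole chunk at once
        except psycopg2.Error:
            conn.rollback()
            for row in chunk:               # slow path: isolate the bad rows
                try:
                    copy_rows(conn, [row])
                except psycopg2.Error:
                    conn.rollback()
                    rejects.write(row)      # log the offending line

    chunk = []
    with open(path) as f:
        for line in f:
            chunk.append(line)
            if len(chunk) >= CHUNK_SIZE:
                flush(chunk)
                chunk = []
    if chunk:
        flush(chunk)

    rejects.close()
    conn.close()

if __name__ == "__main__":
    load("data.txt", "dbname=mydb")

Since the glitches show up only every few tens of thousands of rows, almost
every chunk takes the fast path, so the load runs at close to full COPY
speed and rejects.txt ends up listing exactly the lines that need cleaning.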

-- sgl


